Jackknife and Bootstrap Resampling Methods in Statistical Analysis to Correct for Bias

Author

  • Peter Young
Abstract

This result is obvious but also useful because it tells us that the sample mean is an unbiased estimate of the exact mean, in the sense that the average of the sample mean, over many repetitions, is the exact mean. An unbiased estimate becomes more and more accurate as the number of data points is increased. However, a biased estimate does not continue to improve with increasing N once the error is smaller than the bias. Hence we should work with unbiased estimators. We will also be interested in the variance of the sample mean (again averaged over many repetitions). We find that...
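The abstract is truncated before the final formula. Assuming it refers to the standard result Var(x̄) = σ²/N (an assumption; the equation itself is cut off), both claims can be checked with a short simulation:

```python
import random

# Illustrative sketch (not from the article): check numerically that the
# sample mean is unbiased, and that, assuming the truncated formula is the
# standard result, its variance over repetitions is sigma**2 / N.

random.seed(0)

N = 50          # data points per sample
reps = 20000    # number of independent repetitions
mu, sigma = 3.0, 2.0

means = []
for _ in range(reps):
    sample = [random.gauss(mu, sigma) for _ in range(N)]
    means.append(sum(sample) / N)

avg_of_means = sum(means) / reps
var_of_means = sum((m - avg_of_means) ** 2 for m in means) / reps

print(avg_of_means)   # close to the exact mean mu = 3.0
print(var_of_means)   # close to sigma**2 / N = 0.08
```

The variance of the sample mean shrinks as 1/N, which is why an unbiased estimate keeps improving with more data while a biased one stalls at the bias.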


Similar articles

Bootstrap and Jackknife Resampling Methods in the Survival Analysis of Patients with Thalassemia Major

Background and Objectives: A small sample size can influence the results of statistical analysis. A reduction in the sample size may happen for different reasons, such as loss of information, e.g. missing values in some variables. This study aimed to apply bootstrap and jackknife resampling methods in the survival analysis of thalassemia major patients. Methods: In this historical coh...

Full text
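As a generic illustration of the resampling idea referred to above (a sketch with hypothetical toy data, not the study's actual survival model):

```python
import random

# Generic nonparametric bootstrap sketch (toy data, not the study's actual
# survival analysis): resample with replacement B times and use the spread
# of the replicate statistics as a standard-error estimate.

random.seed(2)
data = [4.1, 2.3, 7.8, 5.0, 3.6, 6.2, 1.9, 4.7]   # hypothetical survival times

def bootstrap_se(xs, stat, B=5000):
    """Bootstrap standard error of stat(xs)."""
    n = len(xs)
    reps = [stat([random.choice(xs) for _ in range(n)]) for _ in range(B)]
    mean_rep = sum(reps) / B
    return (sum((r - mean_rep) ** 2 for r in reps) / (B - 1)) ** 0.5

se = bootstrap_se(data, stat=lambda xs: sum(xs) / len(xs))
print(se)   # roughly the plug-in standard error of the mean
```

The same `bootstrap_se` works for any statistic, e.g. `stat=lambda xs: sorted(xs)[len(xs) // 2]` for a median, which is the kind of quantity where closed-form standard errors are unavailable and resampling is most useful.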

Bias Correction with Jackknife, Bootstrap, and Taylor Series

We analyze bias correction methods based on the jackknife, the bootstrap, and Taylor series. We focus on the binomial model and consider the problem of bias correction for estimating f(p), where f ∈ C[0, 1] is arbitrary. We characterize the supremum norm of the bias of general jackknife and bootstrap estimators for any continuous function, and demonstrate that in the delete-d jackknife, different values o...

Full text
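A toy illustration of jackknife bias correction in the binomial setting described above (my own example, not reproduced from the paper): the plug-in estimator f(p̂) with f(p) = p² has O(1/n) bias, and for quadratic f the delete-1 jackknife removes it exactly, recovering the classical unbiased estimator k(k−1)/(n(n−1)).

```python
# Delete-1 jackknife bias correction, illustrative example (not from the
# paper): estimate f(p) = p**2 from k successes in n Bernoulli trials.
# For quadratic f the jackknife is exactly unbiased, reproducing the
# classical unbiased estimator k*(k-1) / (n*(n-1)).

def jackknife_corrected(xs, f):
    """Jackknife bias-corrected estimate of f(mean of xs)."""
    n = len(xs)
    total = sum(xs)
    theta_full = f(total / n)
    # leave-one-out plug-in estimates
    loo = [f((total - x) / (n - 1)) for x in xs]
    theta_bar = sum(loo) / n
    return n * theta_full - (n - 1) * theta_bar

n, k = 12, 7                       # k successes in n Bernoulli trials
xs = [1] * k + [0] * (n - k)
est = jackknife_corrected(xs, lambda p: p ** 2)
print(est)                         # equals k*(k-1)/(n*(n-1))
```

For non-quadratic f the jackknife removes only the leading O(1/n) bias term, which is one reason different deletion schemes can behave very differently, as the abstract notes.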

Jackknife Empirical Likelihood Inference For The Pietra Ratio

Pietra ratio (Pietra index), also known as the Robin Hood index, Schutz coefficient (Ricci-Schutz index), or half the relative mean deviation, is a good measure of statistical heterogeneity in the context of positive-valued data sets. In this thesis, two novel methods, namely "adjusted jackknife empirical likelihood" and "extended jackknife empirical likelihood", are developed from the jackknife empiri...

Full text

Unbiasing the Bootstrap—Bootknife Sampling vs. Smoothing

Bootstrap standard errors are generally biased downward, which is a primary reason that traditional bootstrap confidence intervals have coverage probability which is too low. For the sample mean the downward bias is a factor of (n−1)/n (for the squared standard error); the same bias holds approximately for asymptotically linear statistics. In the case of stratified or two-sample bootstrapping, th...

Full text
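The (n−1)/n factor for the sample mean can be seen directly: the ideal bootstrap variance of the mean is the plug-in value Σ(xᵢ−x̄)²/n², which is exactly (n−1)/n times the unbiased estimate s²/n. A minimal numerical check (a sketch of the factor only, not of the bootknife procedure itself):

```python
import random

# Minimal check of the (n-1)/n downward bias for the sample mean (sketch,
# not the bootknife procedure): the ideal bootstrap squared standard error
# of the mean is the plug-in value ss / n**2, which is exactly (n-1)/n
# times the unbiased estimate s**2 / n.

random.seed(1)
n = 10
x = [random.gauss(0.0, 1.0) for _ in range(n)]
xbar = sum(x) / n
ss = sum((xi - xbar) ** 2 for xi in x)

boot_var = ss / n ** 2              # ideal bootstrap (squared SE) of the mean
unbiased_var = ss / ((n - 1) * n)   # unbiased estimate s**2 / n

ratio = boot_var / unbiased_var
print(ratio)                        # exactly (n - 1) / n = 0.9
```

The ratio is an algebraic identity, independent of the data, which is why the downward bias is exact for the mean and only approximate for general asymptotically linear statistics.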

A jackknife type approach to statistical model selection

Procedures such as the Akaike information criterion (AIC), Bayesian information criterion (BIC), minimum description length (MDL), and the bootstrap information criterion have been developed in the statistical literature for model selection. Most of these methods rely on an estimate of bias. This bias, which is inevitable in model selection problems, arises from estimating the distance between an unknown tr...

Full text




Publication date: 2010